Transcript Origination Notice:

Transcriptions are machine-generated and may not have been proofread or corrected. Transcriptions are reference, search, and assistive in nature only, and are NOT an official transcript of this video.

00:00:00.000 -- Sure, all right, well welcome everybody.
00:00:00.000 -- I'm assuming we will have some more folks join, and I know people will need to drop off; these are longer-ish meetings for the moment.
00:00:10.000 -- So welcome to the first public government and public sector efficiency subcommittee meeting.
00:00:17.000 -- These meetings are recorded and streamed on TVW, so when we do the regular email updates I'll be sharing the link, so folks can go back, or folks can share it for riveting viewing with friends and family.
00:00:33.000 -- I'm sorry, I've seen way too much of myself lately and we'll be sharing notes etc.
00:00:41.000 -- If you are not currently receiving email updates, I think the last one was sent late last week, but if you haven't received an email update from our office, if you could put your name and contact information into the chat and we'll make sure you're added to the email list so that you do receive those going forward.
00:01:01.000 -- And then we've been keeping these first subcommittee meetings relatively open, as a kind of broad discussion, to learn what folks are thinking and what concerns people have, so that we can have a better idea of where we're starting this conversation, who we're engaging, and what we're anticipating.
00:01:21.000 -- So, as I got you.
00:01:25.000 -- So we can come in next month with a regular schedule and have a clear agenda.
00:01:33.000 -- So these first meetings are a bit free-ranging, and I know folks are interested in structure, and we will get that for you.
00:01:33.000 -- We're just trying to assess where we're starting from.
00:01:46.000 -- So we'll do quick introductions. I'll start with the AGO team, then Sherry and Rick, and then we'll go through all of the other attendees. I'm Ellen, policy manager at the AGO, and I've talked enough, so...
00:02:03.000 -- Good afternoon, excuse me.
00:02:03.000 -- Good afternoon everyone. My name is [inaudible]; I'm also a member of the policy team, and in today's meeting I'll just be stepping in whenever necessary and also compiling minutes that will be sent out later in the week.
00:02:21.000 -- Hi, I'm [inaudible], a future member of the AGO policy team. I've been appointed to work on Ellen's team to support the work of the task force, but my start date is Friday, so in the interim I'm going to listen and kind of catch up, and I'm looking forward to working with everybody.
00:02:45.000 -- You can get bonus points before you even start work. Super thrilled; this is an amazing team, really really awesome. Sherry, go ahead.
00:02:54.000 -- Yeah, hi everyone, my name is Sherry Sawyer, I use she/her pronouns, and I'm the deputy director for policy and outreach over in Governor Inslee's office, and I have technology issues like AI in my policy portfolio.
00:03:07.000 -- Thanks, Sherry. And Rick?
00:03:10.000 -- Good afternoon, I'm Rick, a senior project advisor for the State Auditor's Office, and I apologize for not showing my screen, but I'm having the summer of Wi-Fi madness here.
00:03:24.000 -- I apologize for that.
00:03:26.000 -- But I'm
00:03:28.000 -- looking forward to the conversation.
00:03:30.000 -- Thanks, Rick, and I hope we can get that straightened out. I know those issues seem to come in waves; hopefully we're close to the end.
00:03:39.000 -- All right, Magna would you like to go next.
00:03:42.000 -- Sure, I'm Magdalena Balazinska, the director of the Paul G. Allen School of Computer Science and Engineering at the University of Washington, also a professor in the school, and I'm on the task force.
00:03:53.000 -- I'm gonna make.
00:03:56.000 -- Good afternoon, I'm [inaudible].
00:04:01.000 -- I have about eight different programs within my portfolio; one of them is emerging technology and artificial intelligence.
00:04:08.000 -- So for the purpose of this.
00:04:10.000 -- I, along with a couple of others, run our full AI community of practice
00:04:17.000 -- across the public sector.
00:04:19.000 -- We have
00:04:20.000 -- state, local, and higher education members.
00:04:22.000 -- I'll also be very involved in the implementation of our state's executive order.
00:04:27.000 -- Focus on that.
00:04:28.000 -- Thank you.
00:04:31.000 -- Steve.
00:04:33.000 -- Hi there.
00:04:34.000 -- I'm Steve Zersky I live in this homey area.
00:04:36.000 -- I have a career where I've served as an in-house general counsel for a number of global
00:04:41.000 -- and domestic corporations over the years, and I was very privileged to have former governor Gary Locke support me for an appointment to a US Commerce Department advisory committee involving network security and cybersecurity, which was quite a privilege over many years of doing that.
00:04:59.000 -- So my background is in network security cyber security.
00:05:03.000 -- Since the days of Adam and Eve, so a long time as a corporate general counsel.
00:05:08.000 -- Thanks Steve.
00:05:08.000 -- And just so nobody thinks I'm giving side-eye,
00:05:08.000 -- my notes are on the screen over here.
00:05:08.000 -- I feel like it's always sort of an awkward thing.
00:05:08.000 -- So I'm very much listening.
00:05:18.000 -- All right, Boes.
00:05:20.000 -- Hi, I'm Boes Ashkenazi.
00:05:20.000 -- I am the CEO of augmented AI company based here in Seattle with offices in New York, former meta employee.
00:05:31.000 -- I worked on internal productivity tools for the employees in the company, and I'm also on the board of the Seattle Chamber of Commerce.
00:05:40.000 -- And so I'm super interested in public policy, and we're doing a lot of work with counties and on the local level.
00:05:40.000 -- So I have a lot of interest in the way that AI fits operationally into different municipalities.
00:05:54.000 -- Great.
00:05:55.000 -- Great to hear.
00:05:56.000 -- Rory.
00:05:58.000 -- Thanks, Ellen.
00:05:58.000 -- Hi, everyone.
00:05:58.000 -- I'm Rory Paine-Donovan and I serve as deputy legislative director here at the attorney general's office, and I'm just interested in hearing the conversation today and potentially working on future legislative concepts, should the task force identify some.
00:05:58.000 -- So great to be here.
00:06:19.000 -- Thanks, Rory.
00:06:20.000 -- All right, Marty.
00:06:25.000 -- All right, everyone.
00:06:25.000 -- My name is Marty Maxwell and I am the privacy officer for the Department of Children, Youth and families.
00:06:33.000 -- I'm here today because I'm very, very interested in how AI can enhance our services to our constituents and our clients, and I'm excited to see where this is all going to go.
00:06:46.000 -- Thank you, Marty.
00:06:46.000 -- Cliff.
00:06:50.000 -- Hi, that wasn't quite camera ready, but that's okay.
00:06:50.000 -- I'm Cliff Magnus.
00:06:50.000 -- I'm the privacy manager at the Washington Health Benefits Exchange.
00:06:50.000 -- So we cross private and public sector for administration of the Affordable Care Act in Washington.
00:07:04.000 -- Thanks, Cliff.
00:07:06.000 -- All right, Sylvia.
00:07:17.000 -- Sylvia, are you able to do an introduction? I was planning to listen in.
00:07:25.000 -- Hi, everyone.
00:07:25.000 -- I'm Sylvia and I'm with the AG's office.
00:07:30.000 -- Thanks, Sylvia.
00:07:34.000 -- Farrah.
00:07:43.000 -- Her mic is out.
00:07:43.000 -- Okay.
00:07:43.000 -- Farrah has been joining all of the conversations so far and has shared a lot of really great things in the chat.
00:07:43.000 -- Latoya.
00:07:57.000 -- Hi, good afternoon, everyone.
00:07:57.000 -- I'm going to be off camera for the moment.
00:07:57.000 -- My name is LaToya Belgrade Francis.
00:07:57.000 -- I'm coming from Port Orchard in Kitsap County.
00:08:07.000 -- I kind of represent, you know, kind of the citizens
00:08:13.000 -- of this area.
00:08:16.000 -- I'm an independent consultant with a background as a licensed clinical social worker; I've worked in health care systems and at different levels of government.
00:08:30.000 -- I know a lot of the people who are in health care.
00:08:33.000 -- One of the things that I like to do is help people with understanding what AI integration into the infrastructure looks like specifically for businesses of all sizes.
00:08:46.000 -- I'm also doing some work along the lines of what the task force is doing,
00:08:46.000 -- here in Kitsap County, regarding cybersecurity response.
00:09:00.000 -- You know, just the general security, efficiency, and effective use of AI,
00:09:06.000 -- and what that looks like for Kitsap County.
00:09:06.000 -- So I'm here to help in any way that I can.
00:09:13.000 -- And making sure that people are asking the right questions so they can get the right answers when it comes to how AI is going to be transforming our future right now today.
00:09:25.000 -- So thank you.
00:09:27.000 -- Thanks, Toya. Samantha?
00:09:35.000 -- I'm the enterprise IT policy manager at WaTech; I work with Nick.
00:09:41.000 -- And I'm really just here.
00:09:42.000 -- I'm interested to hear what people are thinking about government and AI.
00:09:42.000 -- So.
00:09:49.000 -- Thanks.
00:09:50.000 -- Let me step in.
00:09:50.000 -- Thanks for joining us.
00:09:55.000 -- And lastly, Leslie.
00:10:05.000 -- Hi, I'm Leslie Clark.
00:10:05.000 -- I am from Whatcom Community College in Bellingham and I work in college administration.
00:10:15.000 -- But I'm not here as an official representative of the college, just as an interested person.
00:10:22.000 -- I am interested in helping the college develop an acceptable use policy for college employees.
00:10:30.000 -- So that's my main interest in this subcommittee.
00:10:30.000 -- Thanks.
00:10:36.000 -- Great.
00:10:36.000 -- Thank you so much.
00:10:36.000 -- Glad everybody is able to be here.
00:10:36.000 -- I think I got everybody.
00:10:44.000 -- I wonder.
00:10:44.000 -- So what we've been doing in these first meetings is a broad discussion of just what's top of mind for you: concerns, hopes, dreams, worries, et cetera, related to, in this case, government, the public sector, and AI; and also what you have seen in other jurisdictions that has either worked or hasn't worked, that we can learn from and should be looking to.
00:11:07.000 -- I wonder, in this particular meeting, if it might be helpful (and I apologize for putting folks on the spot) to do just a quick overview of the governor's executive order, where we're at, and anything that y'all would like to know about it, just to kind of ground us before
00:11:22.000 -- we open up.
00:11:23.000 -- Yeah, that's great.
00:11:25.000 -- Oh, and thank you.
00:11:25.000 -- So some of you might know Governor Inslee issued an executive order back in January on generative AI.
00:11:34.000 -- We borrowed a lot of the content for that from California and their executive order, in the spirit of:
00:11:34.000 -- if it's good, you know, we wanted it.
00:11:45.000 -- It's not plagiarism, it's an homage, something of that nature. But really I do want to turn it over to Nick, the state's technology officer, as he and Katie Ruckle, our chief privacy officer, really refined the EO, made recommendations about what would be in the EO and what would be out of the EO, and have the wonderful task of really implementing the EO.
00:12:11.000 -- I will say, before I turn it over to Nick, just a couple things about the EO. All of the contents of the EO that Nick's going to discuss are on a super tight timeframe, which I think serves this task force really well.
00:12:23.000 -- We have several deliverables that we wanted to get done before the end of Governor Inslee's term and the end of the administration.
00:12:30.000 -- And we thought about, knowing that the AG's office had already introduced the bill to form this task force, what could we do to help inform this task force.
00:12:39.000 -- So I think they work together really well.
00:12:39.000 -- I'll let you all decide, once you hear what's in the EO, how well you think it works.
00:12:46.000 -- But Nick, if you don't mind, and Samantha if it's appropriate, just dive in a little deeper about what we're asking our cabinet agencies to do. And I want to be really clear about that: in an executive order, if you didn't know, the governor can only direct his cabinet agencies to do stuff.
00:13:03.000 -- So in this EO he's asking several agencies for several deliverables. Nick, take it away.
00:13:11.000 -- All right, thanks Sherry.
00:13:11.000 -- So a couple of things that are very important to note: the executive order is exclusively focused on generative AI.
00:13:11.000 -- I think many of us know that AI as a discipline has been around since the 50s.
00:13:28.000 -- Generative AI is kind of the hottest form of it currently.
00:13:28.000 -- And with that comes a lot of speed.
00:13:38.000 -- And so it pays to track the evolution of the industry and the evolution of the capability of models, and also to keep identifying what the risks are and to understand what kind of controls we have and need to deploy.
00:13:57.000 -- The other really important dimension that we can't lose with any kind of technology is the people dimension: how does it impact our staff and our customers?
00:14:08.000 -- How can we use generative AI, and any other technology, to help more traditionally disadvantaged or marginalized communities?
00:14:15.000 -- And so these are the themes that you'll see in the executive order, and when I'm done giving the update I can drop a link to the executive order in the chat.
00:14:26.000 -- Perfect.
00:14:26.000 -- Thank you, Sherry.
00:14:28.000 -- So, as Sherry mentioned, I'm going to kind of give an overview of the deliverables by timeline, not necessarily in the way that they're structured in the executive order.
00:14:39.000 -- The first set of deliverables are due in September, so as you can imagine, we are furiously working to finalize them.
00:14:47.000 -- And there are three key deliverables.
00:14:47.000 -- One of them is a report on generative AI initiatives within state agencies and state government.
00:14:47.000 -- So think of this as: what are the use cases?
00:15:03.000 -- We see lots in the news and on social media and other places about what people think generative AI can do (if you watched the Super Bowl commercials that talk about the dreams of generative AI), so we're trying to distill all that down to: what are the functions of government that can be most enhanced?
00:15:16.000 -- Some of the ones that we talk about, that you'll see in the report when it's published, are around language translation, and around making it easier to access information. As an example,
00:15:28.000 -- if you're a caseworker and you need to access a large amount of policies and information, those kinds of use cases are where we see a lot of opportunity.
00:15:38.000 -- So there's a lot that we can say about what the use cases look like, plus a general description of how generative AI fits into the larger ecosystem of AI, machine learning, and deep learning.
00:15:52.000 -- And then a timeline on how,
00:15:55.000 -- as specifically charged in the executive order, we can establish a sandbox, or environments, for agencies to be able to safely experiment with some of this technology and align it with their business strategy, their business outcomes, or a better way to support their customers or their staff.
00:16:11.000 -- I only went into such detail because I'm very involved in writing that report; I promise I won't go to that level of detail on all the other deliverables.
00:16:19.000 -- The second deliverable that's also due in September is our initial guidelines for procurement. Many of us understand that in the process of deploying technology and supporting Washingtonians, procurement is a very important way that we control risk.
00:16:35.000 -- It's how we make sure that we're efficient and effective stewards of residents' data and of the resources that we use in the state of Washington. So those initial guidelines for procurement, again all of this focused on generative AI, are also due in September.
00:16:52.000 -- The third deliverable that's due in September is led by the Office of Equity, and that's an accountability framework. As I mentioned earlier,
00:17:01.000 -- we really need, in all aspects of government and technology, to be focused on how we are going to minimize the harms and make sure that we can provide value and that we're helping those disadvantaged and marginalized communities.
00:17:16.000 -- We can all think of issues from the last several years, for example with biometrics, where they didn't always train their models on people who may have looked like me.
00:17:38.080 -- And so there are those kinds of harms and like those are the things that we really need to make sure that we have a framework to catch.
00:17:43.640 -- And so that's part of what the Office of Equity is working on.
00:17:48.120 -- So I'll jump to the next wave of deliverables and that's going to be in December and there we've got three.
00:17:56.400 -- First, guidelines on the impact of adopting generative AI on vulnerable communities.
00:18:00.880 -- So think of the Office of Equity's deliverable as the framework; the guidelines are more specific, and those are intended to work together.
00:18:08.440 -- We also in December have guidance on risk assessments specifically for high risk use cases.
00:18:15.600 -- I think that's one thing that I,
00:18:18.000 -- and many others in the technology community and in the public sector, are very concerned about.
00:18:24.920 -- We, WaTech, issued some guidelines on automated decision systems.
00:18:30.320 -- I think that would have been a couple of years ago, last year or the year before.
00:18:34.440 -- And so that's a good example of a specific way of thinking about the deployment of technology, or generative AI and AI: looking at a risk-based approach.
00:18:43.240 -- And so as an example, automated decision systems that make decisions that may impact an individual are one way to look at that.
00:18:51.080 -- And so there are other types of high risk use cases.
00:18:54.080 -- And so that's one area where we're writing additional guidance.
00:18:58.160 -- And then OFM: I mentioned again the people dimension, and we need to make sure that we are supporting and equipping our state workforce.
00:19:06.240 -- And so OFM has a report due in December focused on the impact of generative AI. We've talked about this before in other places, really focusing on how this technology can augment and empower our workforce and not replace our workforce.
00:19:24.120 -- And that's a really key kind of focus.
00:19:27.360 -- And something we've talked about in our work group.
00:19:29.360 -- And then again, that report is due in December.
00:19:31.680 -- And then the final deliverables, the final three if you're keeping count, are in January.
00:19:35.800 -- So, a training plan for state workers. Again, on the people dimension, think of that in two ways.
00:19:40.400 -- One is, what is this going to look like in its impact on the workforce? And then, how are we going to train and equip our workforce in using this technology? We've also got contract terms and templates.
00:19:50.920 -- So our initial guidelines for procurement from WaTech are due in September.
00:19:56.080 -- So think about kind of the way that those can be implemented.
00:19:59.160 -- Like, what are the actual terms? What are the contract templates that agencies can then pick up and use in the assessment or the use of generative AI technology? Those are due from the Department of Enterprise Services in January.
00:20:11.800 -- And then the last one is for the workforce board.
00:20:15.640 -- And that's just like many other types of technology.
00:20:19.360 -- We need to be focused on the pipeline and on where skill sets meet needs.
00:20:24.960 -- You see this in other kind of advanced more complex areas around data analytics and a lot in the data field.
00:20:31.440 -- We're finding that we don't always have the skills we need; we have a huge opportunity to invest more and figure out how to upskill people earlier, before we get to the place where they're working for the state or working to serve alongside our public servants.
00:20:48.920 -- There are ways that we can figure out other partnerships and other programs to upskill individuals, so that is the other deliverable that's due in January.
00:20:58.360 -- And Sherry, I think that covers all of them.
00:21:01.240 -- If I missed any or if I need to go into detail on it, please let me know.
00:21:04.840 -- No, you definitely covered it.
00:21:05.880 -- I'm curious if folks have questions, because that's a lot in a short amount of time.
00:21:11.560 -- I don't have to take that, but.
00:21:13.320 -- Maybe not a question, but one word you used, Nick, was harms.
00:21:19.160 -- And I've been in various roundtables, and someone characterized it as, you know, the horse out of the barn, or a stampede.
00:21:26.440 -- Sometimes the horses and the cattle go in one direction, and the hope is that there is some kind of structure that establishes some kinds of parameters to ensure that there's discipline in the use of these technologies.
00:21:41.080 -- And so as a cybersecurity guy, the ease with which people can be naughty has only increased.
00:21:47.480 -- And so at the roundtable, with a show of hands, almost 30 people said we've got to have some parameters to protect against that harm, and that word that you used stuck out to me.
00:21:58.680 -- So these are great tools.
00:22:00.680 -- Everybody's just running as fast as they can to grab a platform with a level of excitement.
00:22:06.120 -- And one person said, maybe we need to just sort of ease back, throttle back, and get our arms around this, so that it's a wonderful tool, but properly managed to avoid that harm.
00:22:17.640 -- So that sort of stuck in my mind more than anything else.
00:22:20.840 -- So I'm glad to be part of this group.
00:22:22.680 -- And you'll probably hear more than you'd like from me about making sure we don't let the naughty people inappropriately mess with the tools we use these days.
00:22:31.320 -- I think that does bring up a very important principle.
00:22:35.160 -- Regardless of generative AI or any other kind of technology is maximizing the use of your existing processes.
00:22:42.120 -- Generative AI is a new capability, and it has a lot of new capabilities compared to other technologies, like being able to generate new content, but it is a form of technology.
00:22:50.200 -- And we have processes in place, like the security that you mentioned, or privacy, that are designed to identify controls and measure risk before we deploy something.
00:22:58.920 -- And so augmenting those, those processes instead of trying to build something completely brand new is a really important principle that we're really anchored around.
00:23:07.240 -- I support that.
00:23:08.920 -- Yeah, trying to focus out.
00:23:09.800 -- No, it's great.
00:23:10.360 -- Great highlight.
00:23:10.920 -- Thanks, Steve.
00:23:17.320 -- Cliff, I think your hand was up before Toya's.
00:23:22.840 -- It's just a quick question, Nick.
00:23:24.360 -- And you probably thought about this way more than I have.
00:23:27.720 -- Have there been any discussions or thoughts, surrounding any of these reports and deliverables, around AI threat detection? Because I know there's a lot of work going on in that field, and I'm not sure whether we're starting to think about that for state agencies and public entities.
00:23:47.480 -- Yeah, we have a lot of AI threat detection capabilities currently.
00:23:51.480 -- Think about what we do for fraud detection, or for cybersecurity, like for vulnerability mapping.
00:23:56.920 -- I do think, and I would have to validate.
00:24:01.240 -- We have a set of use cases that do focus on cybersecurity, but in generative AI in particular, they aren't quite as prominent when we survey our community of practice.
00:24:16.120 -- I think it was in the list, but it wasn't one of the top five in terms of making information accessible and translation, those kinds of things.
00:24:24.040 -- But there's also the flip side of that: generative AI capabilities, we're also seeing, are being used to increase the threat profile.
00:24:34.600 -- So we all know about phishing emails and the things you traditionally train people to look for in phishing emails.
00:24:42.920 -- Generative AI capabilities on the bad actor side can help make that harder to detect.
00:24:48.200 -- And so we'll need some counter measures for those kinds of capabilities as well.
00:24:51.800 -- Yeah, I guess I should have been more specific, because that last part was exactly where my thoughts were going: phishing and all of that is going to be much more believable.
00:25:04.600 -- Yes, that is something that we have talked about and are looking at as well.
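[Editor's note: to make the countermeasures point above concrete, as AI-written phishing loses traditional tells like typos and odd phrasing, filters tend to lean on structural signals that fluent text cannot fake, such as mismatched sender and reply-to domains or links to untrusted hosts. The sketch below is a toy heuristic for illustration only; the domain allow-list, weights, and keyword list are invented, not anything the state actually uses.]

```python
import re

TRUSTED_DOMAINS = {"example.gov"}  # hypothetical allow-list for illustration

def phishing_score(sender: str, reply_to: str, body: str) -> int:
    """Toy heuristic scoring structural signals that survive fluent AI-written text."""
    score = 0
    sender_domain = sender.rsplit("@", 1)[-1].lower()
    reply_domain = reply_to.rsplit("@", 1)[-1].lower()
    if sender_domain != reply_domain:
        score += 2  # reply-to points somewhere other than the sender
    if sender_domain not in TRUSTED_DOMAINS:
        score += 1  # unknown sender domain
    for host in re.findall(r"https?://([^/\s]+)", body):
        if host.lower() not in TRUSTED_DOMAINS:
            score += 2  # link to an untrusted host
    if re.search(r"\b(urgent|immediately|verify your account)\b", body, re.IGNORECASE):
        score += 1  # pressure language
    return score  # higher means more suspicious
```

[In a real deployment these structural checks would sit alongside, not replace, commercial filtering and user training.]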
00:25:08.840 -- Great.
00:25:08.840 -- Thanks, Cliff.
00:25:09.480 -- Yeah.
00:25:13.000 -- Toya, you're up. Hi, thank you so much, Ellen, for the chance to talk.
00:25:22.360 -- So what I'm hearing is that, you know, the focus of the executive order was, as you just said, generative AI, as a pathway to having a very deep, good, comprehensive conversation about what AI is in general.
00:25:47.240 -- So like I think somebody mentioned like AI has been around since the 50s and technically we use artificial intelligence.
00:25:55.320 -- So we use it technically every day.
00:25:58.600 -- And now the way that we're using AI is more state-of-the-art versus the way that it has been used, and the way that it's being used now is for those large language model, prompt-writing, generative AI processes.
00:26:14.440 -- Now, there are so many other aspects of artificial intelligence and machine learning that come along with the large language model and generative AI.
00:26:27.000 -- But that's more theoretical; it's what people know about, how people interact with it day to day on a user-to-platform basis.
00:26:40.520 -- So for the parts that I think we're going to have to start focusing on, we're going to have to put the generative AI mantle off to the side here a little bit,
00:26:51.560 -- and definitely start shifting focus to the parts of AI that are involved in robotic process automation.
00:27:02.280 -- And I think our Washington CTO mentioned that.
00:27:06.600 -- And that's exactly what we need to be focusing on, expanding on, and shoring up, because with AI, we want to start using it for automation: automating the things that we don't need people to do, like sending emails or sending reminders about one thing or another.
00:27:26.600 -- We need our people to do the people work and keep them focused on those things that only people can do.
00:27:31.320 -- Because at the end of the day, on the end of any AI software that you use, there's a person there who created it.
00:27:37.960 -- There's the person there who is monitoring it.
00:27:43.240 -- There's a person there who's updating all the coding, doing those things in ways that happen almost instantaneously.
00:27:49.560 -- And there's people there that need to do that.
00:27:54.040 -- So, you know, I would suggest maybe looking into more of what robotic process automation we could do within the state: business, government, public and private alike.
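[Editor's note: the kind of robotic process automation suggested here, sending routine reminders so staff can focus on the people work, can be sketched in a few lines. This is a minimal illustration only; the addresses, task list, and SMTP host are hypothetical placeholders, not any actual state system.]

```python
import smtplib
from email.message import EmailMessage
from datetime import date, timedelta

# Hypothetical open tasks: (recipient, task description, due date)
OPEN_TASKS = [
    ("caseworker@example.gov", "annual review for case 1234", date.today() + timedelta(days=2)),
    ("analyst@example.gov", "quarterly data submission", date.today() + timedelta(days=30)),
]

def reminders_due(tasks, window_days=7):
    """Return the tasks whose due date falls within the reminder window."""
    today = date.today()
    return [t for t in tasks if today <= t[2] <= today + timedelta(days=window_days)]

def send_reminders(tasks, smtp_host="smtp.example.gov"):
    """Send one automated reminder email per task that is coming due."""
    with smtplib.SMTP(smtp_host) as server:
        for recipient, task, due in reminders_due(tasks):
            msg = EmailMessage()
            msg["From"] = "noreply@example.gov"
            msg["To"] = recipient
            msg["Subject"] = f"Reminder: {task} due {due.isoformat()}"
            msg.set_content(f"This is an automated reminder that '{task}' is due on {due}.")
            server.send_message(msg)
```

[A job like this would typically run on a schedule; the point is that the rule, "remind people a week before a deadline", is mechanical and does not need a person.]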
00:28:09.720 -- What are we doing about authenticating users? Because I don't know if you all know, but if you're on social media as much as I'm on social media, many of the accounts that are part of that platform ecosystem are fake.
00:28:24.520 -- They're dupes; they're meant just to scam.
00:28:35.560 -- And so can we do anything about those individuals who use the platforms that are already out there to, you know, broker our data? Well, we've already tried that, and it doesn't seem to be ringing any bells with the tech companies.
00:28:52.520 -- But what we could do is we can focus on artificial intelligence that specifically identifies who's a real person and who's not a real person.
00:29:05.880 -- It's called KYC, know your customer, know your business, you know, whatever it is.
00:29:12.200 -- So we'd be focusing on a type of AI that confirms that you're a real person,
00:29:18.360 -- or that you're the person who's authorized to do business or engage in this transaction, take this class, get this ID, what have you.
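[Editor's note: for readers unfamiliar with the term, a KYC ("know your customer") check typically verifies a submitted identity against an authoritative record before authorizing a transaction. The sketch below is a minimal illustration under that assumption; the registry, record fields, and exact-match rule are all made up for the example.]

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Identity:
    name: str
    date_of_birth: str  # ISO format, e.g. "1980-05-01"
    id_number: str

# Hypothetical authoritative registry; in practice this would be a vetted data source.
REGISTRY = {
    "WA-12345": Identity("Jane Doe", "1980-05-01", "WA-12345"),
}

def kyc_check(submitted: Identity) -> bool:
    """Pass only if every submitted field matches the record on file."""
    on_file = REGISTRY.get(submitted.id_number)
    return on_file is not None and on_file == submitted
```

[Real KYC systems layer on fuzzy matching, document checks, and liveness detection; this only shows the shape of the decision.]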
00:29:24.200 -- So for those types of things, I don't know if any other person on the call here has other ideas about what to look for when it comes to being the most cyber-secure, secure by design, you know, being obsessed with risk.
00:29:41.400 -- What are the risks here? What is the harm here? Being proactive. You know, letting robots do things is good, depending on what your industry is.
00:29:57.960 -- Are you using actual physical robots, or automated vehicles, or a robotic process created through software?
00:30:15.560 -- So that's kind of where I'm coming from, as a private citizen and a person who actually does prompt writing for large language models.
00:30:21.640 -- That's one of the tasks that I do as an independent contractor.
00:30:27.720 -- I write prompts for large language generative AI models.
00:30:27.720 -- And that's a whole different realm from what we need to do working in the field every day.
00:30:37.160 -- That's theoretical.
00:30:37.160 -- What we need to be doing is looking at the practical applications.
00:30:42.920 -- So I don't know if anybody else has any thoughts about what those practical applications may or may not be.
00:30:48.120 -- Thanks.
00:30:48.120 -- Oh, sorry for the screaming kids.
00:30:55.160 -- You can hear them.
00:30:55.160 -- Sorry.
00:30:55.160 -- Maybe I can comment, since I grabbed hold of the word that Nick used earlier about the harms.
00:31:02.040 -- It's surprising: a number of the top cybersecurity firms are actually engaging in private joint ventures to figure out how they, in a collective way, can add some products that they don't have, to detect that fake entity that we were talking about, recognizing that none of their current software offerings are that swift at making sure, and I keep using the word naughty,
00:31:32.760 -- the naughty people aren't disseminating fake information. And it is surprising; they recognize that some of their companies' best software programs don't work.
00:31:40.360 -- So they're trying to seize upon what other companies have, and peculiar joint ventures amongst some interesting competitors are being discussed in private.
00:31:52.520 -- So there's that thing that you just talked about: fake information.
00:31:59.640 -- And once we turn over a lot of support services to that, quote, robot, we really have got to ascertain how correct it is. And so people keep talking about what a wonderful tool this is; this is Steve, who's been worried about cybersecurity from day one.
00:32:13.640 -- We really need to figure out just how good that generated information is, because when you've got the top cyber firms saying they really don't have the tools, that's indicative of some alarm at my level.
00:32:26.120 -- So I'll end with my continued speech about naughty people.
00:32:32.280 -- Well Steve, what I'll add is that there are companies already out there who can help with those types of infrastructure type of things.
00:32:47.480 -- And you know what we talked about like these cyber security professionals, they're working with their own software.
00:32:53.000 -- However, there's companies like IBM.
00:32:53.000 -- IBM watsonx has a good program.
00:32:59.400 -- I know that Lumen Technologies has a good program.
00:32:59.400 -- And these tech companies, versus the cybersecurity companies, have the technology and the staff and the designers and all they need to create something very secure, like unhackable, like literally 100 percent unhackable.
00:33:24.360 -- So the software itself is unhackable.
00:33:24.360 -- But what you have to think about, and I think this might be a blind spot for some of the cybersecurity firms, is that it's not necessarily the AI that you have to be worried about specifically causing harm; it's the people on the inside of your company who have access to do the harm.
00:33:44.520 -- So that also needs to be a very clear focus area for every business, anybody who employs somebody else.
00:34:02.200 -- And so, like IBM Watson; I think Accenture does something along those lines.
00:34:02.200 -- There's plenty of tech companies who are already out there who have been doing the work, and they're streamlining themselves with AI so that they can help other businesses streamline themselves, if you kind of know what I'm saying.
00:34:20.360 -- So yeah, I agree with you.
00:34:20.360 -- Certain companies have those technologies, but if you walk door to door with companies, it's amazing how many companies don't have the best cybersecurity or this technology you just referred to; there's thousands of companies that aren't that swift with it.
00:34:37.000 -- But we're on the same sheet of music.
00:34:37.000 -- I agree with you.
00:34:44.040 -- I definitely appreciate the conversation, Steve, and LaToya.
00:34:44.040 -- It's like fascinating.
00:34:44.040 -- We could spend a lot of time on that topic and that topic alone.
00:34:49.480 -- I know somebody else had their hand up and maybe they've given up.
00:34:56.120 -- Marty, what's that? I think Marty had their hand up.
00:34:56.120 -- Oh, thank you.
00:34:56.120 -- Oh, hi.
00:35:06.040 -- So kind of pivoting into some areas that I'm not sure fall under the executive order, but just to be another sticky on Nick's wall behind him over there: we were having a conversation the other day about where we could end up going under the Public Records Act with some of these AI capabilities.
00:35:31.400 -- Would we be required to use AI to create certain types of records that we don't currently create, but we have the data to do it.
00:35:40.760 -- We just don't have the capability to do it.
00:35:48.120 -- So it's a very good high-level topic there.
00:35:48.120 -- But as I say, it's a sticky to go on the board: this has the potential to really, oh, you're so cool, Nick.
00:35:55.880 -- It could really turn the tables quite a bit in terms of how we currently provide transparency and access to public records.
00:36:12.200 -- So it's just something to be thinking about.
00:36:12.200 -- So that's all I had.
00:36:12.200 -- No, I appreciate that. I know enough to be dangerous about the Public Records Act too, and one thing that's always stuck in my mind, and I'll count on my friends at the Attorney General's office to correct me, is that you don't have to create something new to respond to a Public Records Act request; if it doesn't exist, you don't have to provide it.
00:36:31.880 -- That is, until a judge comes along and says you should, because you could have.
00:36:37.640 -- And that's the problem with the Public Records Act.
00:36:37.640 -- It's not a static law.
00:36:45.880 -- It really depends a lot on what the last judge says.
00:36:45.880 -- So it's just again something to really just keep in mind as we go on this journey.
00:36:54.360 -- Yeah.
00:36:54.360 -- And I will say, just on that in particular, that's one of the use cases that we've talked about within WaTech; we also manage the open data program.
00:37:05.080 -- And so if you think about the opportunities that we have to make our open data more accessible and more useful to members of the public: that's not specifically talking about the Public Records Act, but it's around that intent of being more transparent, providing more kind of actionable, consumable information to our residents.
00:37:22.440 -- I think there are some opportunities and some use cases that we've been talking about and looking at within WaTech.
00:37:27.160 -- There is also the flip side of that around the use of generative AI: is a prompt a record? Is an output a record?
00:37:32.120 -- We put some of those things in the guidelines, and that's another piece.
00:37:37.400 -- And again, following the Public Records Act in the use of technology, some of those components are pretty clear.
00:37:41.720 -- But that is another important point we've talked about.
00:37:47.400 -- Thanks for bringing that up, Marty.
00:37:47.400 -- We can certainly bring our AAG experts on the PRA in to provide some information and have them think it through.
00:37:54.840 -- But my thought, Sherry, went to exactly what you said.
00:38:01.640 -- We don't have a responsibility to create records that don't exist just to have a record.
00:38:06.200 -- So Ellen, that about sums it up for the EO and where that's going, and hopefully some of those deliverables. And I guess, I was thinking it went without saying, but I'll just say it.
00:38:21.720 -- Of course, the deliverables, as they are delivered, will be available not just to this subgroup but to the entire task force, to the extent they can be helpful.
00:38:36.360 -- Is there anything else you were hoping that we would cover, Ellen, or do you have suggestions on where we should be going from here, based on how the other groups have gone?
00:38:42.760 -- I think the other groups have gone into sort of a more open discussion about hopes and dreams, what's keeping you up at night, et cetera.
00:38:55.720 -- But before we do that, I just really want to applaud Nick and Samantha and Sherry and all of those who have been working on that because that is a monumental task.
00:39:06.840 -- And on our side, when we write reports, the difference between a December deadline and a January deadline is obviously significant. So well done.
00:39:11.960 -- And I'm looking forward to going through that and all of the work that you've put together.
00:39:18.200 -- Yeah.
00:39:30.040 -- Well, let's go ahead.
00:39:30.040 -- I think Cliff was first actually.
00:39:30.040 -- Cliff was just clapping I think.
00:39:35.880 -- Oh, Cliff was just applauding.
00:39:35.880 -- I was just applauding.
00:39:35.880 -- Yeah.
00:39:35.880 -- Yeah, I guess my question is, as we move forward, how can the subcommittee be the most helpful? You know, as we kind of move forward with these deliverables that you guys have outlined, and also as you guys go back to the bigger group, how can this group help and inform? Yeah, and I think you talked about that a little bit yesterday in the privacy and consumer protections group, about the bylaws that the task force is currently working on, which I think will really inform how these subcommittees work.
00:40:23.080 -- But I'm very happy for the discussion to inform those bylaws.
00:40:29.240 -- So yeah, I think that's going to be a question I'm noodling on for this subcommittee and all of them as we go forward.
00:40:38.840 -- But for this one in particular, given the work that is happening now and the limitations that Sherri alluded to, that an executive order covers the governor's office and the cabinet agencies but doesn't cover separately elected officials or local jurisdictions: you know, what lessons or what things can we take from all of the great work that's happened and help apply them more broadly, where maybe there just haven't been the resources, et cetera.
00:41:13.000 -- And how can we continue to build on them? That's, I guess, the most immediate thing, and as we develop more of a comprehensive work plan, I think we'll have clearer answers to that.
00:41:26.920 -- So stay tuned a little bit.
00:41:26.920 -- Rick.
00:41:36.520 -- Thank you.
00:41:36.520 -- I guess one of the things I wanted to add, if I could, maybe it's just from the experience that we've had in the state auditor's office.
00:41:41.880 -- I'm trying to lower my hand now as I'm talking.
00:41:49.400 -- So I'll just leave it out there.
00:41:49.400 -- But, um, Auditor McCarthy put together a task force for the office over a year ago and tasked me to lead that effort.
00:41:56.600 -- And so we've been going at this work for over a year.
00:42:04.440 -- And when the governor's executive order came out, of course, we took that and made sure that we were modeling what we were doing so that we would be in line with what the other state departments were all going to be doing.
00:42:17.640 -- And so that's been our focus.
00:42:25.960 -- But one of the things I think that we experienced and continue to is making sure that our focus is in the areas that we actually have the ability to have an impact on.
00:42:39.480 -- One of the things we all know who are on this call right now is that in this conversation and in this space if you will, you can really go down a rabbit hole really quickly.
00:42:46.120 -- And all of the things that we talk about are very important.
00:42:52.680 -- But if we lose ourselves talking about things that larger bodies, if you will, need to conquer, we lose our opportunity to focus where we actually can.
00:43:06.920 -- And for the auditor's office, one of the things we've done is we've stood up an internal testing group; we have about 29 employees who are building test projects.
00:43:22.280 -- And so we're going to have to move methodically, and probably more slowly than many would like.
00:43:29.240 -- But we've got to get it right too.
00:43:29.240 -- And we know that in the end, even when there are products that are available off the shelf, so to speak, the way we do our business and the way we interact with the way others do their business means nothing is really off the shelf.
00:43:42.440 -- We really have to be deliberate about what we do and where we do it.
00:43:48.600 -- So again, I guess I don't want to be a buzzkill, if you will.
00:43:56.600 -- But I do want us to make sure that as we go forward, we manage our expectations and keep our focus if we can, and find ways to maybe place something in a parking lot, in an advocacy area, or somewhere where we may not have the direct ability to have the impact that we want.
00:44:19.400 -- But we still want to be able to advocate on behalf of.
00:44:24.200 -- So that leads me to just a logistical question that Ellen maybe can answer, and I think you said this at the first meeting of the task force: what are you imagining the cadence is for the subgroups? I think there are 11 right now.
00:44:40.680 -- So monthly, I think, is ideal; I don't think we have the capacity on the staff side to support more frequent than that.
00:44:57.960 -- But I also think, in order to keep up the conversation with the urgency of the issue, we need to try to aim for that.
00:45:04.040 -- The other thing I'll note is that we also have a cybersecurity and state security subcommittee, I think it's called, and I believe Sherri and Rick are the designees for the moment, at least.
00:45:14.680 -- So I think we'll have a conversation about that and how these things fit together.
00:45:20.520 -- That's my notes.
00:45:20.520 -- I did not make that up out of thin air.
00:45:28.280 -- That's what my notes said.
00:45:36.360 -- So yeah, I think the subcommittee will be evolving fairly quickly as we get into it and figure out what makes sense and goes together.
00:45:42.200 -- That's a long way of saying monthly is the answer to your question.
00:45:51.240 -- Well, one thing I'll just offer as time goes on and this starts to flesh itself out is we're starting a couple projects with two counties, one in Cook County where Chicago is and one in Douglas County in Nebraska where Omaha is.
00:46:04.920 -- And we're going to be learning a lot of things from those counties, and I'm happy to bring that back to this subcommittee and share kind of what we're learning.
00:46:09.720 -- We're just getting started with those two and so it's good timing.
00:46:13.720 -- Great.
00:46:13.720 -- Well, this morning you talked about the trip to Oklahoma and what you learned there, and I wonder if there's anything that comes to mind for this discussion that you'd like to offer.
00:46:26.840 -- Yeah, so this was at the beginning of the summer.
00:46:34.360 -- Senator Joe Winn and I went to Oklahoma to talk to their state legislature about the AI task force.
00:46:40.840 -- So it was stood up last summer, and then they reported back to the governor in January.
00:46:46.360 -- So they went pretty fast, and they're kind of done now.
00:46:46.360 -- And so a couple things came out of that.
00:46:52.280 -- One: legislation went in front of their House. It didn't pass, but it did way better than they thought it was going to in terms of the votes that it got.
00:46:57.960 -- So they're going to refine some things, and it's going to go back in front of them next session.
00:47:05.560 -- And yeah, there was a lot of interesting use cases around traffic and some things related to energy, stuff that wasn't predictable like, you know, it wasn't all just internal operational efficiency and generative AI related stuff.
00:47:18.840 -- You know, there was some computer vision things that were happening that were related to potholes and uncovering those kinds of things for the government.
00:47:28.360 -- There was just, I was surprised to hear the breadth of the things they were looking at.
00:47:28.360 -- Now, the other big part of it too, and I think the reason why things didn't pass this session was there's still a lot of fear and there's a lot of risk, you know, and some people want to take it really slow and be careful.
00:47:42.920 -- And the last thing I'll say is that we were surprised that after it was delivered to the governor in January, not a lot happened in the six months after that; it really slowed down.
00:47:55.240 -- And so I think we'll be paying close attention to what actually happens after they deliver these things.
00:48:17.480 -- It does bring up an interesting question.
00:48:17.480 -- I don't know if anybody has the answer to.
00:48:17.480 -- I mean, I know a lot of states have certainly introduced legislation around artificial intelligence, some regulating public use of it, some trying to regulate private use of it.
00:48:25.720 -- I'm not sure; I mean, I know other people have a better handle on what's actually happened.
00:48:31.560 -- But that would be a, I don't know, what am I trying to say? There are different organizations that summarize legislation, like NCSL, for example; they always do kind of a deep dive at the end of every year: what got dropped, what actually passed, what does it do? Do we have, or does anybody have, that kind of synopsis available, so we could use that for some ideas on what we want to consider in terms of our recommendations? I raised my hand for a different reason, but I think the Future of Privacy Forum does a pretty good job of capturing AI-related legislation.
00:49:13.960 -- So I'll talk to Katie.
00:49:13.960 -- There are also some other members I could talk to: NASCIO, which is the association of state CIOs, and see what they have.
00:49:28.120 -- Well, those are a couple of places where at least we'll have a picture.
00:49:28.120 -- There are some folks that I work with at UC Berkeley; they do a good job of capturing the California perspective, and they have a database of some other relevant legislation.
00:49:36.920 -- So I'm sure we have the data somewhere; at this point it's just, like all data, in a couple different places.
00:49:41.560 -- So I'll see what we can do and kind of bring back the best national picture.
00:49:46.920 -- The other thought that I had, Ellen, when you were talking earlier: I mentioned the executive order is good.
00:49:58.360 -- It's focused on generative AI, I kind of gave the summary there.
00:49:58.360 -- I also mentioned that Katie, Rucka, and I lead the AI community of practice.
00:50:02.600 -- In that community of practice, we started that before the executive order and that's really focused on all forms of AI.
00:50:07.560 -- We have a couple of of subcommittees, but the important point is that it includes many members of different types of local government and higher education.
00:50:19.000 -- And it meets every month.
00:50:19.000 -- So one of the things I thought I could do is see what kind of key themes come up: we have a use cases subcommittee, we have one focused on policy, we have a subcommittee focused specifically on what the local government challenges are.
00:50:33.480 -- So I will see what kind of themes we can bring back that we're identifying and maybe just give a quick summary of what we're looking at in that group and see if there's some things that resonate with the task force that could be elevated and that we could help address.
00:50:50.520 -- That's fantastic.
00:50:50.520 -- Thanks Nick.
00:50:54.360 -- Yeah, thank you.
00:50:54.360 -- And so just building on this great conversation, and thinking not just about this subcommittee but about all of the subcommittees, I think it would be extremely helpful for us to have a kind of repository where we can accumulate links to the different types of legislation that have passed in other states, and other resources that we have been discussing.
00:51:16.600 -- And again, I think that would be helpful for this subcommittee, that would be helpful for all the other subcommittees to have something like that.
00:51:19.480 -- So we have a sense of what else is happening.
00:51:24.200 -- So just encouraging us to have something that also spans with the other groups.
00:51:28.840 -- And to tie on to that, Magda: a lexicon of definitions, because it's way beyond a glossary or anything like that.
00:51:38.200 -- So something like that would be helpful.
00:51:38.200 -- So we all speak the same language.
00:51:47.640 -- Roger that.
00:51:47.640 -- We'll have to figure out tech options.
00:51:47.640 -- I know we've had issues with SharePoint with folks outside of the agency.
00:51:52.760 -- I think we've got some particular considerations as, like, the state's law firm, but very much agree that that's useful.
00:51:57.720 -- So we'll figure out how to make that happen.
00:52:05.720 -- And hear you loud and clear on definitions, Cliff.
00:52:05.720 -- Steve, go ahead.
00:52:13.880 -- Yeah, someone mentioned, I was trying to track it.
00:52:13.880 -- You've been looking at something developed by California, and that may be what others have talked about here.
00:52:19.640 -- Is that a complete statute, or what have they done that you looked at from a parallel standpoint? I may have missed the whole conversation, but somebody commented about what California has released when we were talking about correlating our effort here in Washington.
00:52:38.120 -- So I, oh yeah, that was me.
00:52:38.120 -- So there's two parallels that we've mentioned so far.
00:52:44.040 -- One was what Sherry mentioned earlier.
00:52:44.040 -- There's a lot of similarities between our executive order and California's executive order from January.
00:52:52.520 -- And then I also mentioned that I have had the opportunity to work with some researchers at UC Berkeley who have been focused on this; they keep a database of California-related legislation that focuses on AI.
00:53:02.360 -- It's only focused on one state.
00:53:02.360 -- It doesn't give us a national perspective.
00:53:06.120 -- But it is a collection of information.
00:53:06.120 -- So that's the other correlation.
00:53:21.960 -- I've lost track of who raised their hand; this does not give me the order in which people raised their hands.
00:53:31.080 -- So I don't know who raised their hand first.
00:53:31.080 -- Somebody else does.
00:53:36.760 -- Let's go with LaToya.
00:53:36.760 -- The way they're showing up is LaToya, Leslie, and then Rick on my screen.
00:53:41.320 -- Awesome.
00:53:43.400 -- Yeah, so the interesting, you know, hearing, you know, the conversation of the task force subcommittee of government, you know, and policy and hearing, you know, how these measures to regulate AI have failed in legislatures all over the country.
00:54:01.800 -- And I wonder if people are asking the right questions, or looking at, you know, what is it that we should regulate. So one of the thoughts that I had, you know, just at the end of the day: the reason why there's so many, you know, software companies out there doing software and cybersecurity and IT and all that is because AI, as it stands now, is open source.
00:54:37.640 -- So that means anybody can use it.
00:54:37.640 -- Anybody who is savvy enough to learn, basically, you know, not to be facetious, but anybody and their mom has access to open source AI.
00:54:45.560 -- So for me as a person who is very interested in privacy and security, you know, what's public versus what's private.
00:55:03.880 -- So basically, open source means it's public information.
00:55:03.880 -- So anybody who wants to use AI in really any way at this point can use it; no restrictions on who can, what you're using it for, no nothing.
00:55:20.200 -- I think that's a huge problem, but that's just me personally.
00:55:20.200 -- So there's a difference between AI that's open source, where anybody has access to it,
00:55:25.720 -- and instead making a recommendation for making AI closed source: that means that only, you know, authorized people who have a need to know or need access to the algorithms, to the coding, to the networks, to the cloud computing and the data centers, only people who are authorized, get access to use that to create anything technical.
00:55:58.280 -- So basically licensing AI; you know, we do it with other software now.
00:56:06.680 -- But that would be, when it comes to how we systematically look at things, to help prevent these bad actors from being able to use this high-tech software in any way they please, because they will, and they will continue to do that as long as they can.
00:56:26.680 -- So I, as a private citizen and an average user, a person who has been, you know, a victim of identity theft and fraud and all kinds of online criminal activity just because I'm online a lot.
00:56:40.040 -- That's what I would want.
00:56:40.040 -- Like if I knew that there'd be a way that the government can say we're going to limit who has access to this type of AI generative machine learning, all of the features of AI instead of allowing it to just be open source and be like anybody out here can use it.
00:56:58.920 -- So that's what I wanted to say about that.
00:56:58.920 -- Thank you.
00:57:07.480 -- Thanks Latoya.
00:57:07.480 -- Lastly.
00:57:07.480 -- Yeah, thanks.
00:57:07.480 -- So I noticed that of the subcommittees that are listed on the AG's website, several of them actually have the words cybersecurity or privacy or public safety and so forth.
00:57:28.760 -- And I just want to raise a flag for this subcommittee focusing at least half on the benefits that state workers could gain from using AI because the risks are so big and obvious that there's a chance that all the subcommittees will focus on that and there won't be anything else to report.
00:57:54.840 -- So I'd like to see this subcommittee define its charge as having something at least to do with how state workers can use AI well and efficiently to improve their work life and the products that they produce.
00:58:12.280 -- That makes a lot of sense.
00:58:12.280 -- Yeah, and that's um, so I know I've already signed myself up for a bunch of homework.
00:58:24.360 -- So I will try not to continue to do so.
00:58:24.360 -- But in the generative AI report that I mentioned, all of the first sections, other than "what is it," are about its use cases and the value proposition and the tasks that it can help with.
00:58:37.000 -- Once the report is finalized, like Sherry mentioned, we'll share it with everyone in the task force.
00:58:47.800 -- But that's a particular section that we could dig into Leslie and talk more about specific applicability and use cases.
00:58:52.760 -- There are also other use cases that we'll need to identify that don't solely leverage generative AI technology.
00:58:58.280 -- And so we have some of those use cases and some data that we're trying to collect across the state around potential or current uses of AI to improve the services that we provide.
00:59:10.920 -- So that's something else; I think we are also trying to strike a balance, and I would love to help, if that's the way, Ellen, that you would steer this subcommittee to go; we would be happy to help provide any information we have on use cases and benefits.
00:59:28.440 -- Thank you.
00:59:28.440 -- Yeah, I think use cases and benefits to state workers and how to improve that, and also to the residents of Washington, to make sure they're getting, you know, high-quality services and everything that that means.
00:59:53.400 -- I think Rick, you had had your hand up.
00:59:59.000 -- Actually, I didn't I was trying to type and I think I hit the button so, so thanks.
01:00:17.080 -- Actually, since there is dead air, I'll try to fill it.
01:00:17.080 -- I'll just lend my voice to the idea that we do try to focus, again, where we can have outcomes, and I think Sherri had mentioned the idea of focusing on the state employees and their abilities to use these tools.
01:00:41.080 -- And again, I just think that's a really good thing for us to continually remind ourselves of is that again, we can get lost if we don't keep that focus.
01:00:48.280 -- And I know within the state auditor's office, we have, like I said, a number of test projects already underway that have been created by employees to do that very thing, you know, to take some task and either make it more automated or less human-intensive.
01:01:12.840 -- And so again, I think we should keep our focus on those kind of things as well.
01:01:20.360 -- Thanks.
01:01:25.720 -- Latoya.
01:01:28.840 -- Yeah, thanks again.
01:01:28.840 -- I actually have this was a second point that I was going to bring up, but it's slipped my mind, but now I've recovered it.
01:01:32.680 -- One of the things that I would put out to the group is maybe getting an idea about how do you do this type of AI work when it comes to government entities in and of themselves and how they then set the standard for the private and public sector.
01:02:00.760 -- And just for the group's knowledge, they're doing that in Dubai in the United Arab Emirates.
01:02:09.080 -- So if you're not aware, the leadership there is royalty, and they decided many, many years ago, once the oil boom-and-bust happened, because they saw that coming for their region, that one of the next steps in their strategic plan was to be the AI capital of the globe.
01:02:28.040 -- So they are doing a lot of things there when it comes to AI and integrating into the government.
01:02:35.720 -- And I think it would just be interesting to kind of see what that process looks like from a non-American government's perspective, because at the end of the day, right, the royal family, you know, they make the decisions.
01:02:57.160 -- However, they are incredibly, incredibly worker-centered there as well.
01:03:08.040 -- And so every decision that the leadership of their country makes when it comes to integrating AI into their day-to-day operations is 100% focused on how do we make things easier for our people, because most of their primary citizens are Emiratis.
01:03:26.600 -- So they're people who are part of their royal family or the royalty of that region.
01:03:32.760 -- And then there are people who live there, you know, who live in Dubai business there, but their primary focus is their people.
01:03:41.000 -- And according to them, everybody who lives there, the Emiratis and the other citizens of the country, is their responsibility, and they look first and foremost at how AI impacts their citizens and their residents and their leaders themselves.
01:04:00.280 -- So what I'd love to recommend is that some, you know, some members of this group go on a fact-finding, you know, information-gathering trip to Dubai.
01:04:15.880 -- If we want to know about how it is to integrate AI into your government, then that would be a good place to start.
01:04:23.480 -- And in addition to the task forces that are already out there,
01:04:23.480 -- I would be interested to know, besides Oklahoma, what other states have task forces that either have been doing work or are about to embark on work like we are.
01:04:37.400 -- Thank you.
01:04:45.400 -- Well, I see some of you have put a link to some of the work that you were referencing in the chat.
01:04:51.960 -- So that's great.
01:04:51.960 -- And Nick and Ellen, I think between the two of you, you probably have a fairly good understanding of how many other states have task forces.
01:05:03.240 -- I think there's like almost a dozen of them, or at one point in time there was, anyway.
01:05:08.920 -- Yeah, I think that's right.
01:05:08.920 -- I've struggled to get a handle on exactly how many.
01:05:08.920 -- But I think it's about a dozen.
01:05:16.760 -- They have different scopes.
01:05:16.760 -- Some of them are very narrow and some of them like ours are very, very broad.
01:05:21.000 -- More like 20-ish.
01:05:29.320 -- At this point.
01:05:29.320 -- Oh, well.
01:05:34.200 -- You keep.
01:05:36.440 -- Thanks.
01:05:36.440 -- I just want to make an observation that I think this task force, in fact, this subcommittee, has an opportunity to be super impactful to the work of the entire task force.
01:05:49.080 -- In the sense that, you know, a lot of the work that Nick and the team are doing is pursuant to this executive order.
01:05:57.160 -- A lot of the expertise that's being deployed by the state employees, people on this call and elsewhere, can really inform
01:06:03.800 -- a lot of the top themes that I think run through a lot of these subcommittees.
01:06:10.120 -- You know, I think about, you know, a description of generative AI, accountability framework, service assessments, impact analysis, you know, equity framework, these things.
01:06:20.600 -- There really are things that run through everything.
01:06:27.960 -- And the fact that the state itself is an actor here, and a client slash customer of AI at a very large scale,
01:06:35.480 -- and that as a result, you know, we necessarily have to think through these things,
01:06:41.560 -- really, I think, you know, provides a focus and energy to the work of this subcommittee.
01:06:50.120 -- I think it will be very useful to make sure that the benefits, the outputs of that, get fed into the other subcommittees.
01:06:57.480 -- So, you know, the quality of the conversation today, the thought fulness here, really excited about the role.
01:07:03.080 -- This particular subcommittee can play, in the work of the overall task force.
01:07:17.560 -- Hopefully we live up to your lofty expectations.
01:07:26.680 -- So for a next step before our next meeting, would it be, I don't know, maybe other subcommittees have, um, logistically put a mechanism in place to do this, um, but to collect ideas for what we do want to focus on.
01:07:41.640 -- So we can, as several people have mentioned, really narrow our scope and really concentrate and focus on several different areas.
01:07:51.000 -- And I think it would be great to, uh, coalesce around some things, but we need to have those ideas generated first.
01:07:57.000 -- So, um, how have other subcommittees kind of taken that on, Ellen, so we can copy from them?
01:08:07.800 -- That's a great question.
01:08:07.800 -- The first subcommittees met yesterday.
01:08:13.240 -- So I think we're already finding a clearer way to do the meetings.
01:08:19.399 -- I feel like we're growing even this quickly. But I think the broad-ranging discussions are super helpful in kind of seeing where people are and what we're starting with.
01:08:30.520 -- And I think the team is going to be doing a lot of engagement with folks who haven't joined in these discussions yet, and also a lot of work on putting together a work plan.
01:08:43.560 -- And so, um, Sherri and Rick, you can expect us to be bugging you a lot more before the next meeting.
01:08:50.200 -- Um, and I think the other thing is we will want to, um, have sort of a regular schedule, like, you know, the second Tuesday of each month or something like that, that we'll start nailing down for September going forward.
01:09:01.960 -- So it's more predictable.
01:09:06.600 -- Um, but I think we will be able to have a good agenda for September and start zeroing in.
01:09:13.960 -- And that'll be flexible depending on how the conversations go, how all of the work to implement the EO comes through, and what comes from that.
01:09:19.720 -- Okay.
01:09:19.720 -- Great.
01:09:29.880 -- Let me come in here. You know, I was listening to Rick and Leslie, and what resonated with me is we've got to be careful we don't take too big of a bite.
01:09:34.920 -- Um, just listening to everybody all excited, and I am too.
01:09:41.319 -- And I thought, boy, if we take too big of a bite, we may not accomplish as much as we could.
01:09:47.319 -- And that means orienting around what few things we can accomplish in this limited amount of time.
01:09:53.720 -- So that we can clap our hands later and say, we didn't go to the moon, but we did a pretty good job of these few things.
01:09:59.160 -- And whatever those few things are, I guess that's our job: to figure out what we can put our arms around and be very successful within.
01:10:09.800 -- Yeah.
01:10:09.800 -- So, Ellen, when you and Sherry were talking, Sherry kind of sprung the thought around what key things we want to focus on.
01:10:18.440 -- And I pulled up the page, uh, the attorney general's office has that describes the overall work of the task force.
01:10:29.640 -- And it could be a valuable exercise if each of us went through and highlighted the top three things that we think we could focus on.
01:10:35.080 -- Uh, I think that's a good one. It really describes the entire portfolio of work of the task force and the subcommittees.
01:10:46.120 -- But there are some that we specifically could look at, um, recommendations on appropriate uses and limitations on the use of AI by state and local governments.
01:10:56.600 -- It then says "and the private sector." So, you know, maybe that's an area where we wouldn't dive as deep, but I think looking at these objectives and maybe picking, you know, the top three, I know it's hard to limit it to three, the things that are talking about safety and the importance of, back to Steve's point, what's manageable.
01:11:10.600 -- Um, that may be a valuable exercise, at least to kind of ground us and figure out what are the things that we expect to report up to the task force on.
01:11:19.800 -- Uh, and like Steve said, be able to clap our hands and say that we produced something in alignment with the expectations in, um, in the legislation.
01:11:32.680 -- That's excellent, Nick.
01:11:32.680 -- Um, is everybody good with that exercise? Nobody's going to say no.
01:11:41.000 -- I put the link that Nick is referring to in the chat, and it's the task force work at the bottom, with a big chunk of text.
01:11:48.200 -- Um, I think there's 17 deliverables or something like that.
01:11:56.920 -- Um, yeah, I think that's a great, that's a great exercise.
And I think one of the challenges for this, and just broadly across the subcommittees, is figuring out what the appropriate scope is, because this is a big issue. It needs to be narrow enough to be realistic, but I also don't want to make us smaller than we need to be.
01:12:13.560 -- Um, and miss things that we could have gotten.
01:12:20.120 -- And so finding the right sort of zoom or focus, uh, that feels like a pun, but I didn't mean it to be.
01:12:27.000 -- Uh, I think it's going to be one of the challenges, and I think it's exciting, but I'm weird.
01:12:33.480 -- So, uh, the good kind of weird.
01:12:44.040 -- Thank you.
01:12:48.120 -- I think it gets back to the comment somebody made earlier at the start, though: coming up with or developing a good set of guardrails that we can produce and, uh, show others, if you will, how to build off of.
01:13:02.760 -- Uh, because again, this isn't a one-stop kind of exercise.
01:13:09.000 -- This thing is going to be growing, and, you know, people are going to be doing what we're doing for eons into the future, trying to figure this stuff all out. Establishing those parameters frees people up, but it also protects us from those security-type issues that Steve has continuously raised throughout this, that we all continuously hear about and are concerned about.
01:13:29.960 -- So, I think again, if we can kind of find a focus, get those guardrails in place, and then develop an actual, uh, product, if you will, that could be demonstrated and reacted to, that gives people assurances that, you know, okay, we can now do these things.
01:13:50.360 -- It's so theoretical still for so many people that that scariness, I think, fills in all the void.
01:13:57.080 -- And so again, I like the idea of the deliverable, something that we can show and work off of.
01:14:18.040 -- All right, and we'll get to work on compiling resources for this subcommittee, for all the subcommittees, and Nick, maybe we'll put our heads together and see if we can figure out which 20 states have task forces and which 30 other states need to get it together and get theirs going.
01:14:29.320 -- And just get some good grounding here to build on this conversation.
01:14:47.400 -- Sherry and Rick, would you like to close us out? Thanks for, thanks.
01:14:50.600 -- I guess, for me, I'm thinking our next step would be to kind of riff off Nick's idea.
01:14:58.200 -- Maybe Ellen and Rick and I can talk about, like, logistically how we solicit those things, whether it's through emails, and when we want to compile them by, so that we're prepared for the next meeting.
01:15:10.600 -- So if folks are good with it, the three of us, and whoever else, Ellen, is in your world that we should include, can work on how we'll mechanically do that, and then we'll count on having that ready for the next meeting so that we can have a discussion about it.
01:15:27.080 -- Does that sound like a good next step? That works for me.
01:15:31.080 -- Great.
01:15:31.080 -- And for anybody who joined after I said it at the very beginning, if you're, if you're not getting the task force email updates or haven't received any yet, please put your contact information in the chat.
01:15:43.080 -- That way we know we can get in touch with everybody here.
01:15:48.840 -- And we'll get the list updated by the end of the week.
01:15:54.680 -- Yeah, just really appreciate everybody's participation and conversation.
01:15:54.680 -- It's great.
01:15:54.680 -- Very exciting.
01:16:03.720 -- Can I, I'm sorry, I know we were about to close and I was like, do I want to ask? But does the AGO have specific priorities for their office, keeping in mind that we're about to come to an election and there's going to be a change there?
01:16:23.240 -- Have there been any rumblings of what the future, you know, AG would want to see when it comes to maybe that office in particular and how the task force is working, like, within, you know, the four walls of his office, anything like that? Or is the purpose of this committee to set the priorities for the AG? I guess that should be the question.
01:16:54.280 -- The full task force, we are implementing Engrossed Second Substitute Senate Bill 5838, which establishes the artificial intelligence task force, and so all of this is to meet statutory requirements. Anything, you know, beyond that, Yuki will be the AG's representative as a member of the task force, and all the work we're doing is to satisfy the requirements of the bill, the law, the statute, whichever feels better.
01:17:27.880 -- I don't know why.
01:17:33.960 -- Yeah, absolutely.
01:17:33.960 -- Thank you.
01:17:33.960 -- I appreciate it.
01:17:41.640 -- I think that the cap on that comment is that hopefully we'll be independent of any political winds that may breeze through in November.
01:17:47.000 -- For the moment, we're doing this.
01:17:53.400 -- All right.
01:17:53.400 -- This has been a great discussion.
01:17:53.400 -- Thank you.
01:17:59.400 -- Yes.
01:17:59.400 -- Thank you.
01:17:59.400 -- Yeah.
01:17:59.400 -- Thanks.
01:17:59.400 -- I'm trapped.
01:18:06.920 -- Thank you.
01:18:06.920 -- Right.
01:18:06.920 -- Have a good afternoon, everybody.
01:18:06.920 -- See you next month.
01:18:10.600 -- Right.